Straightforward intermediate rank tensor product smoothing in mixed models
Authors
∗Mathematical Sciences, University of Bath, Bath BA2 7AY, U.K. [email protected]
†Department of Statistics, LMU, München, Germany
Abstract
Tensor product smooths provide the natural way of representing smooth interaction terms in regression models because they are invariant to the units in which the covariates are measured, hence avoiding the need for arbitrary decisions about the relative scaling of variables. They would also be the natural way to represent smooth interactions in mixed regression models, but for the fact that the tensor product constructions proposed to date are difficult or impossible to estimate using most standard mixed modelling software. This paper proposes a new approach to the construction of tensor product smooths, which allows the smooth to be written as the sum of some fixed effects and some sets of i.i.d. Gaussian random effects: no previously published construction achieves this. Because of the simplicity of this random effects structure, our construction is usable with almost any flexible mixed modelling software, allowing smooth interaction terms to be readily incorporated into any Generalized Linear Mixed Model. To achieve the computationally convenient separation of smoothing penalties, the construction differs from previous tensor product approaches in the penalties used to control smoothness, but the penalties have the advantage over several alternative approaches of being explicitly interpretable in terms of function shape. Like all tensor product smoothing methods, our approach builds up smooth functions of several variables from marginal smooths of lower dimension, but unlike much of the previous literature we treat the general case in which the marginal smooths can be any quadratically penalized basis expansion, and there can be any number of them. We also point out that the imposition of identifiability constraints on smoothers requires more care in the mixed model setting than it would in a simple additive model setting, and show how to deal with the issue. An interesting side effect of our construction is that an ANOVA-decomposition of the smooth is available.
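The central device here, rewriting a quadratically penalized smooth as unpenalized fixed effects plus i.i.d. Gaussian random effects, can be sketched directly. The following Python fragment is a minimal illustration under our own naming, not the authors' code: it eigendecomposes a penalty matrix S and splits the coefficient space into the penalty null space (fixed effects) and a rescaled range space whose penalty becomes the identity, i.e. i.i.d. random effects.

```python
import numpy as np

def mixed_model_reparam(X, S, tol=1e-10):
    """Rewrite a penalized smooth f = X @ beta, penalty beta' S beta,
    as f = Xf @ delta + Z @ b with penalty b'b, so that in a mixed
    model b ~ N(0, sigma_b^2 I) and delta is an unpenalized fixed effect.
    """
    lam, U = np.linalg.eigh(S)             # S symmetric positive semi-definite
    pen = lam > lam.max() * tol            # strictly penalized directions
    Xf = X @ U[:, ~pen]                    # penalty null space -> fixed effects
    Z = X @ U[:, pen] / np.sqrt(lam[pen])  # rescaled so the penalty is b'b
    return Xf, Z
```

Any marginal quadratically penalized basis fits this template; what the paper contributes is a tensor product construction whose penalties keep the resulting random effect blocks this simple.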
Similar Resources
Low-rank scale-invariant tensor product smooths for generalized additive mixed models.
A general method for constructing low-rank tensor product smooths for use as components of generalized additive models or generalized additive mixed models is presented. A penalized regression approach is adopted in which tensor product smooths of several variables are constructed from smooths of each variable separately, these "marginal" smooths being represented using a low-rank basis with an...
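The "marginal smooths to tensor product" step referred to here is, at the basis level, a row-wise Kronecker product of the marginal model matrices. A minimal Python sketch (function name ours):

```python
import numpy as np

def row_kron(X1, X2):
    """Row-wise Kronecker product of two marginal model matrices.
    X1: (n, k1), X2: (n, k2); returns (n, k1*k2), where row i equals
    np.kron(X1[i], X2[i]) -- the tensor product basis at observation i.
    """
    n = X1.shape[0]
    return (X1[:, :, None] * X2[:, None, :]).reshape(n, -1)
```

The penalty on the k1*k2 tensor product coefficients is then assembled from the marginal penalties; the constructions surveyed in these papers differ mainly in how that assembly is done.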
Fast and compact smoothing on large multidimensional grids
A framework of penalized generalized linear models and tensor products of B-splines with roughness penalties allows effective smoothing of data in multidimensional arrays. A straightforward application of the penalized Fisher scoring algorithm quickly runs into storage and computational difficulties. A novel algorithm takes advantage of the special structure of both the data as an array and the...
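The array structure can be exploited through the identity (B2 ⊗ B1) vec(Θ) = vec(B1 Θ B2ᵀ), which replaces one huge Kronecker model matrix with two small matrix products. A quick numerical check of the identity in Python (an illustration of the trick, not the paper's algorithm):

```python
import numpy as np

rng = np.random.default_rng(0)
B1 = rng.standard_normal((50, 8))    # marginal basis, first grid dimension
B2 = rng.standard_normal((40, 6))    # marginal basis, second grid dimension
Theta = rng.standard_normal((8, 6))  # coefficient array on the grid

# Naive evaluation: build the full (2000 x 48) Kronecker model matrix.
naive = np.kron(B2, B1) @ Theta.ravel(order="F")
# Array trick: two small matrix products; the Kronecker product is never formed.
fast = (B1 @ Theta @ B2.T).ravel(order="F")

print(np.allclose(naive, fast))  # True
```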
Reverse Engineering Point Clouds to Fit Tensor Product B-Spline Surfaces by Blending Local Fits
Being able to reverse engineer from point cloud data to obtain 3D models is important in modeling. As our main contribution, we present a new method to obtain a tensor product B-spline representation from point cloud data by fitting surfaces to appropriately segmented data. By blending multiple local fits our method is more efficient than existing techniques with the ability to deal with a detai...
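The underlying primitive, a smoothed tensor product B-spline surface fitted to gridded data, is available off the shelf, e.g. in SciPy. The sketch below shows only that primitive, not the paper's blending of local fits, and the smoothing factor s is an arbitrary tuning choice:

```python
import numpy as np
from scipy.interpolate import RectBivariateSpline

# Noisy samples of a smooth surface on a grid (toy stand-in for scan data).
x = np.linspace(0.0, 1.0, 40)
y = np.linspace(0.0, 1.0, 30)
Z = np.sin(2 * np.pi * x)[:, None] * np.cos(np.pi * y)[None, :]
Z += 0.05 * np.random.default_rng(1).standard_normal(Z.shape)

# Cubic tensor product B-spline fit; s > 0 trades fidelity for smoothness.
surf = RectBivariateSpline(x, y, Z, kx=3, ky=3, s=0.5)
print(surf(0.25, 0.5))  # evaluate the fitted surface at a point
```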
Backfitting in Smoothing Spline ANOVA
A computational scheme for fitting smoothing spline ANOVA models to large data sets with a (near) tensor product design is proposed. Such data sets are common in spatial-temporal analyses. The proposed scheme uses the backfitting algorithm to take advantage of the tensor product design to save both computational memory and time. Several ways to further speed up the backfitting algorithm, such a...
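Backfitting itself is a short fixed-point loop: each component smooth is refit to the partial residuals left by the others until convergence. A generic Python sketch (toy interface, ours; the paper's array-based speed-ups are not shown):

```python
import numpy as np

def backfit(y, smoothers, n_iter=30):
    """Fit an additive model y ~ alpha + f_1 + ... + f_p by backfitting.
    smoothers[j] maps a partial-residual vector to fitted values f_j,
    e.g. a linear smoother in the j-th covariate.
    """
    alpha = y.mean()
    f = [np.zeros_like(y) for _ in smoothers]
    for _ in range(n_iter):
        for j, S in enumerate(smoothers):
            others = sum(f[k] for k in range(len(f)) if k != j)
            f[j] = S(y - alpha - others)
            f[j] -= f[j].mean()  # centre each component for identifiability
    return alpha, f
```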
A Newton-Grassmann Method for Computing the Best Multilinear Rank-(r1, r2, r3) Approximation of a Tensor
We derive a Newton method for computing the best rank-(r1, r2, r3) approximation of a given J × K × L tensor A. The problem is formulated as an approximation problem on a product of Grassmann manifolds. Incorporating the manifold structure into Newton’s method ensures that all iterates generated by the algorithm are points on the Grassmann manifolds. We also introduce a consistent notation for ...
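For comparison, the standard first-order alternative to such a Newton method is higher-order orthogonal iteration (HOOI), which cycles through the modes and updates each factor by a thin SVD of a projected unfolding. A compact Python sketch (HOOI, not the paper's Newton-Grassmann iteration):

```python
import numpy as np

def hooi(A, ranks, n_iter=20):
    """Rank-(r1, r2, r3) Tucker approximation of a 3-way tensor A by
    higher-order orthogonal iteration (a simple fixed-point scheme)."""
    # Initialise with the truncated HOSVD: leading left singular
    # vectors of each mode-n unfolding.
    U = [np.linalg.svd(np.moveaxis(A, n, 0).reshape(A.shape[n], -1),
                       full_matrices=False)[0][:, :r]
         for n, r in enumerate(ranks)]
    for _ in range(n_iter):
        for n in range(3):
            G = A
            for m in range(3):
                if m != n:  # project every mode except n onto its factor
                    G = np.moveaxis(np.tensordot(G, U[m], axes=(m, 0)), -1, m)
            Gn = np.moveaxis(G, n, 0).reshape(A.shape[n], -1)
            U[n] = np.linalg.svd(Gn, full_matrices=False)[0][:, :ranks[n]]
    return U  # orthonormal factor matrices U1, U2, U3
```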
Journal: Statistics and Computing
Volume: 23, Issue: -
Pages: -
Publication date: 2013